-
Formal verification is a gold standard for building reliable computer systems. Certified systems in particular come with a formal specification and a proof of correctness that can easily be checked by a third party. Unfortunately, verifying large-scale, heterogeneous systems remains out of reach of current techniques. Addressing this challenge will require compositional methods capable of accommodating and interfacing a range of program verification and certified compilation techniques. In principle, compositional semantics could play a role in enabling this kind of flexibility, but in practice existing tools tend to rely on simple and specialized operational models that are difficult to interface with one another. To tackle this issue, we present a compositional semantics framework that can accommodate a broad range of verification techniques. Its core is a three-dimensional algebra of refinement that operates across program modules, levels of abstraction, and components of the system's state. Our framework is mechanized in the Coq proof assistant, and we showcase its capabilities with multiple use cases.
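As a rough illustration of the refinement judgment described above (my own toy rendering in OCaml, not the paper's Coq development; every name here is invented for the sketch), a semantics can be modeled as the set of replies a module may produce for each query, with refinement as behavior containment:

(* Toy model: a semantics is a predicate over query/reply pairs. *)
type ('q, 'r) sem = 'q -> 'r -> bool

(* Refinement as behavior containment, checked at one observation:
   every behavior of the implementation must be permitted by the spec. *)
let refines_at (impl : ('q, 'r) sem) (spec : ('q, 'r) sem) (q : 'q) (r : 'r) : bool =
  not (impl q r) || spec q r

(* The three dimensions of the refinement algebra would show up as:
   - horizontal: linking two open modules into a single semantics,
   - vertical: transitivity of refinement across levels of abstraction,
   - spatial: decomposing the relation by components of the system state. *)

The sketch only conveys the shape of the refinement judgment; in the framework itself these compositions are stated and proved as Coq theorems.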
-
Verified compilation of open modules (i.e., modules whose functionality depends on other modules) provides a foundation for end-to-end verification of the modular programs ubiquitous in contemporary software. However, despite decades of intensive investigation, the proposed approaches remain difficult to use in practice because they rely on assumptions about the internal workings of compilers that make it hard for external users to apply the verification results. We propose an approach to verified compositional compilation without such assumptions, in the setting of compiling heterogeneous modules written in first-order languages supporting global memory and pointers. Our approach is based on the memory model of CompCert and a new discovery that a Kripke relation with a notion of memory protection can serve as a uniform and composable semantic interface for the compiler passes. By absorbing the rely-guarantee conditions on memory evolution for all compiler passes into this Kripke Memory Relation, and by piggybacking requirements on compiler optimizations onto it, we obtain compositional correctness theorems for realistic optimizing compilers as refinements that directly relate the native semantics of open modules and are oblivious to intermediate compilation steps. Such direct refinements support all the compositionality and adequacy properties essential for verified compilation of open modules. We have applied this approach to the full compilation chain of CompCert with its Clight source language and demonstrated that our compiler correctness theorem is open to composition, intuitive to use, and reduces verification complexity, through end-to-end verification of non-trivial heterogeneous modules that may freely invoke each other (e.g., mutually recursively).
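To picture what such a Kripke relation might look like, here is a deliberately simplified OCaml sketch (the names, the flat toy memory, and the single-memory rely condition are all my own simplifications, not CompCert's actual definitions):

(* A location is a block identifier plus an offset; a toy memory maps
   locations to values. *)
type block = int
type loc = block * int
type mem = (loc, int) Hashtbl.t

(* A Kripke world records, among other things, which locations are
   protected, i.e. private to the module and off-limits to its environment. *)
type world = { protected : loc list }

type kripke_mem_rel = {
  accessible : world -> world -> bool;   (* how worlds may evolve across calls *)
  related : world -> mem -> mem -> bool; (* source/target memories related at a world *)
}

(* The rely condition a module may assume after an external call returns:
   the world evolved along [accessible] and protected locations are intact. *)
let rely (r : kripke_mem_rel) (w : world) (w' : world)
    (m_before : mem) (m_after : mem) : bool =
  r.accessible w w'
  && List.for_all
       (fun l -> Hashtbl.find_opt m_after l = Hashtbl.find_opt m_before l)
       w.protected

In the approach described above, one such relation is shared by all compiler passes, which is what lets per-pass correctness proofs compose into a single direct refinement.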
-
Memory models play an important role in the verified compilation of imperative programming languages. A representative one is the block-based memory model of CompCert, the state-of-the-art verified C compiler. Despite its success, the abstraction over memory space provided by CompCert's memory model is still primitive and inflexible. In essence, it uses a fixed representation for identifying memory blocks in a global memory space and a globally shared state for distinguishing between used and unused blocks. Therefore, any reasoning about memory must work uniformly over the global memory; it is impossible to reason individually about different sub-regions of memory (i.e., the stack and global definitions). This not only incurs unnecessary complexity in compiler verification, but also poses significant difficulty for supporting verified compilation of open or concurrent programs that need to work with contextual memory, as manifested in many previous extensions of CompCert. To remove these limitations, we propose an enhancement of the block-based memory model based on nominal techniques, which we call the nominal memory model. By adopting key concepts of nominal techniques such as atomic names and supports to model the memory space, we are able to 1) generalize the representation of memory blocks to any type satisfying the properties of atomic names and 2) remove the global constraints for managing memory blocks, enabling flexible memory structures for open and concurrent programs. To demonstrate the effectiveness of the nominal memory model, we develop a series of extensions of CompCert based on it. These extensions show that the nominal memory model 1) supports a general framework for verified compilation of C programs, 2) enables intuitive reasoning about compiler transformations on partial memory, and 3) enables modular reasoning about programs working with contextual memory. We also demonstrate that these extensions require limited changes to the original CompCert, making verification techniques based on the nominal memory model easy to adopt.
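A toy OCaml sketch of the idea (my own illustrative names and definitions, far simpler than the Coq development): block identifiers become an abstract type of atomic names with a freshness operation over a finite support, so allocation no longer depends on a globally shared next-block counter, and CompCert's concrete block identifiers are recovered as one instance:

(* Atomic names: any type with decidable equality and a way to pick a
   name outside a given finite support. *)
module type ATOMIC_NAME = sig
  type t
  val equal : t -> t -> bool
  val fresh : t list -> t
end

(* A memory parameterized by its block names; the support is the finite
   set of blocks in use, replacing a global "next block" counter. *)
module Memory (Block : ATOMIC_NAME) = struct
  type block = Block.t
  type t = { support : block list; contents : block * int -> int option }

  let empty = { support = []; contents = (fun _ -> None) }

  let alloc (m : t) : block * t =
    let b = Block.fresh m.support in
    (b, { m with support = b :: m.support })
end

(* CompCert-style numeric block identifiers are one instance of atomic names. *)
module NatBlock : ATOMIC_NAME with type t = int = struct
  type t = int
  let equal = Int.equal
  let fresh used = 1 + List.fold_left max 0 used
end

Intuitively, once allocation only needs a local support, different sub-regions of memory no longer have to share one global state, which is the flexibility the extensions above exploit.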
-
Carbon dots (CDs) have received extensive attention in the last decade for their excellent optical, chemical, and biological properties. In recent years, CD composites have also attracted significant interest owing to their ability to improve the intrinsic properties and expand the application scope of CDs. In this article, the synthesis processes of four types of CD composites (metal–CD, nonmetallic inorganic–CD, organic–CD, and multi-component–CD composites) are first systematically summarized. Recent advances in the bioapplications of these composites (bioimaging, drug delivery, and biosensing) are then highlighted and discussed. Finally, the current challenges and future trends of CD composites in biomedical fields are discussed.
-
Where does the carbon released by burning fossil fuels go? Currently, ocean and land systems remove about half of the CO2 emitted by human activities; the remainder stays in the atmosphere. These removal processes are sensitive to feedbacks in the energy, carbon, and water cycles that will change in the future. Observing how much carbon is taken up on land through photosynthesis is complicated because carbon is simultaneously respired by plants, animals, and microbes. Global observations from satellites and air samples suggest that natural ecosystems take up about as much CO2 as they emit. To match the data, our land models generate imaginary Earths where carbon uptake and respiration are roughly balanced, but the absolute quantities of carbon being exchanged vary widely. Getting the magnitude of the flux right is essential to make sure our models are capturing the right pattern for the right reasons. Combining two cutting-edge tools, carbonyl sulfide (OCS) and solar-induced fluorescence (SIF), will help develop an independent answer to how much carbon is being taken up by global ecosystems. Photosynthesis requires CO2, light, and water. OCS provides a spatially and temporally integrated picture of the "front door" of photosynthesis, proportional to CO2 uptake and water loss through plant stomata. SIF provides a high-resolution snapshot of the "side door," scaling with the light captured by leaves. These two independent pieces of information help us understand plant water and carbon exchange. A coordinated effort to generate SIF and OCS data through satellite, airborne, and ground observations will improve our process-based models to predict how these cycles will change in the future.
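For readers who want the "proportional to CO2 uptake" statement in symbols, a relationship commonly used in the OCS literature (not spelled out in the abstract itself) links the plant OCS flux to gross primary productivity (GPP) through the leaf relative uptake ratio (LRU) and the ambient mixing ratios:

\[
  F_{\mathrm{OCS}} \;\approx\; \mathrm{LRU} \cdot \mathrm{GPP} \cdot \frac{\chi_{\mathrm{OCS}}}{\chi_{\mathrm{CO_2}}}
  \qquad\Longrightarrow\qquad
  \mathrm{GPP} \;\approx\; \frac{F_{\mathrm{OCS}}}{\mathrm{LRU}} \cdot \frac{\chi_{\mathrm{CO_2}}}{\chi_{\mathrm{OCS}}}
\]

where F_OCS is the leaf OCS uptake and the \chi terms are atmospheric mixing ratios; SIF then supplies the complementary, light-driven constraint described above.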